
    Development and Testing of a 2-D Transfer CCD

    This paper describes the development, operation, and characterization of charge-coupled devices (CCDs) featuring an electrode structure that allows charge to be transferred both horizontally and vertically through the image area. Such devices have been termed two-dimensional (2-D) transfer CCDs (2DT CCDs), in contrast to conventional devices, which might be called one-dimensional transfer CCDs; in all other respects they are the same as conventional CCDs. Batches of two different 2DT CCD test devices, featuring different electrode structures but identical clocking operation, were produced and tested. The methodology of 2-D charge transfer in each device type is described, followed by a presentation of test results from the new CCDs. The ability of both 2DT CCD transfer electrode schemes to transfer charge successfully in both the horizontal and vertical directions in the image section of the devices has been proven, opening up potential new applications for 2DT CCDs.
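
    As an informal illustration of the clocking concept described above (not of the device physics or the actual electrode timing), the following Python sketch models charge packets in the image area as a 2-D array that can be shifted one pixel per clock step along either axis; all names and parameters are hypothetical.

        # Toy illustration (not device physics): charge packets in a CCD image
        # area represented as a 2-D array, shifted one pixel per clock step.
        import numpy as np

        def clock_shift(image, direction):
            """Shift all charge packets by one pixel; direction is 'up', 'down',
            'left' or 'right'. Charge clocked off the edge is returned as readout."""
            if direction == "down":
                readout = image[-1, :].copy()
                shifted = np.roll(image, 1, axis=0)
                shifted[0, :] = 0
            elif direction == "up":
                readout = image[0, :].copy()
                shifted = np.roll(image, -1, axis=0)
                shifted[-1, :] = 0
            elif direction == "right":
                readout = image[:, -1].copy()
                shifted = np.roll(image, 1, axis=1)
                shifted[:, 0] = 0
            else:  # "left"
                readout = image[:, 0].copy()
                shifted = np.roll(image, -1, axis=1)
                shifted[:, -1] = 0
            return shifted, readout

        frame = np.zeros((8, 8))
        frame[3, 3] = 1000.0                     # a single charge packet
        frame, _ = clock_shift(frame, "right")   # horizontal transfer
        frame, _ = clock_shift(frame, "down")    # vertical transfer
        print(np.argwhere(frame > 0))            # packet has moved to row 4, column 4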

    Estimation of room acoustic parameters: the ACE challenge

    Reverberation Time (T60) and Direct-to-Reverberant Ratio (DRR) are important parameters which together can characterize sound captured by microphones in non-anechoic rooms. These parameters are important in speech processing applications such as speech recognition and dereverberation. The values of T60 and DRR can be estimated directly from the Acoustic Impulse Response (AIR) of the room. In practice, the AIR is not normally available, in which case these parameters must be estimated blindly from the observed speech in the microphone signal. The Acoustic Characterization of Environments (ACE) Challenge aimed to determine the state of the art in blind acoustic parameter estimation and to stimulate research in this area. A summary of the ACE Challenge and the corpus used in the challenge is presented, together with an analysis of the results. Existing algorithms were submitted alongside novel contributions, and the comparative results are presented in this paper. The challenge showed that T60 estimation is a mature field where analytical approaches dominate, whilst DRR estimation is a less mature field where machine learning approaches are currently more successful.
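
    When the AIR is available, T60 and DRR can be computed with textbook methods: Schroeder backward integration with a linear fit on the decay curve for T60, and an energy ratio around the direct-path peak for DRR. The Python sketch below follows those standard definitions; the fit range and the 2.5 ms direct-path window are assumptions, not the ACE Challenge's reference implementation.

        import numpy as np

        def t60_from_air(h, fs):
            """Reverberation time from an AIR via Schroeder backward integration:
            fit the -5 dB to -25 dB portion of the decay and extrapolate to -60 dB."""
            edc = np.cumsum(h[::-1] ** 2)[::-1]              # energy decay curve
            edc_db = 10 * np.log10(edc / edc[0] + 1e-12)
            t = np.arange(len(h)) / fs
            mask = (edc_db <= -5) & (edc_db >= -25)
            slope = np.polyfit(t[mask], edc_db[mask], 1)[0]  # dB per second (negative)
            return -60.0 / slope

        def drr_from_air(h, fs, window_ms=2.5):
            """Direct-to-reverberant ratio: energy in a short window around the
            direct-path peak versus all later energy, in dB."""
            n0 = int(np.argmax(np.abs(h)))
            w = int(window_ms * 1e-3 * fs)
            direct = np.sum(h[max(0, n0 - w):n0 + w] ** 2)
            reverb = np.sum(h[n0 + w:] ** 2)
            return 10 * np.log10(direct / (reverb + 1e-12))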

    Assessing and countering reaction attacks against post-quantum public-key cryptosystems based on QC-LDPC codes

    Code-based public-key cryptosystems based on QC-LDPC and QC-MDPC codes are promising post-quantum candidates to replace quantum-vulnerable classical alternatives. However, a new type of attack based on Bob's reactions has recently been introduced and appears to significantly reduce the lifetime of any keypair used in these systems. In this paper we estimate the complexity of all known reaction attacks against QC-LDPC and QC-MDPC code-based variants of the McEliece cryptosystem. We also show how the structure of the secret key and, in particular, the secret code rate affect the complexity of these attacks. It follows from our results that QC-LDPC code-based systems can indeed withstand reaction attacks, on condition that specific decoding algorithms are used and the secret code has a sufficiently high rate.
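
    For context, GJS-style reaction attacks exploit the dependence of the decoding failure rate on the distance spectrum of the secret sparse circulant blocks. The sketch below computes that spectrum directly from a block's support, i.e. the quantity an attacker tries to reconstruct from observed reactions; the toy parameters are illustrative and the attack mechanics themselves are not shown.

        from collections import Counter

        def distance_spectrum(support, p):
            """Multiplicity of each cyclic distance between ones of a circulant
            block of size p whose first row has ones at the given positions.
            Reaction attacks of the GJS type recover this spectrum from the
            correlation between decoding failures and the error patterns used."""
            spectrum = Counter()
            ones = sorted(support)
            for i in range(len(ones)):
                for j in range(i + 1, len(ones)):
                    d = (ones[j] - ones[i]) % p
                    spectrum[min(d, p - d)] += 1   # distances are symmetric mod p
            return spectrum

        # Toy secret block: circulant size p = 17, column weight 4
        print(distance_spectrum([0, 2, 7, 11], 17))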

    Analysis of reaction and timing attacks against cryptosystems based on sparse parity-check codes

    In this paper we study reaction and timing attacks against cryptosystems based on sparse parity-check codes, which encompass low-density parity-check (LDPC) codes and moderate-density parity-check (MDPC) codes. We show that the feasibility of these attacks is not strictly tied to the quasi-cyclic (QC) structure of the code but is related to the intrinsically probabilistic decoding of any sparse parity-check code. These attacks therefore not only work against QC codes but can be generalized to broader classes of codes. We provide a novel algorithm that, in the case of a QC code, allows recovering a larger amount of information than is retrievable through existing attacks, and we use this algorithm to characterize new side-channel information leakages. We devise a theoretical model for the decoder that describes and justifies our results. Numerical simulations confirm the effectiveness of our approach.
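
    A minimal simulation of the underlying effect: a syndrome bit-flipping decoder for a sparse parity-check code whose success and iteration count vary with the error pattern, which is exactly the kind of data-dependent behaviour that reaction and timing side channels observe. The decoder and code parameters below are simplified assumptions, not the algorithm analysed in the paper.

        import numpy as np
        rng = np.random.default_rng(0)

        def bit_flip_decode(H, s, max_iter=20):
            """Tiny syndrome bit-flipping decoder. Returns (recovered error or None,
            iterations used); both the outcome and the running time depend on how
            the error pattern interacts with the sparse parity-check matrix."""
            n = H.shape[1]
            e_hat = np.zeros(n, dtype=int)
            for it in range(max_iter):
                syn = (s + H @ e_hat) % 2                      # residual syndrome
                if not syn.any():
                    return e_hat, it
                upc = H.T @ syn                                # unsatisfied checks per bit
                e_hat = (e_hat + (upc == upc.max()).astype(int)) % 2
            syn = (s + H @ e_hat) % 2
            return (e_hat if not syn.any() else None), max_iter

        # Toy sparse code: 60 checks x 120 bits, row weight 8, errors of weight 6.
        n, r, row_w, t, trials = 120, 60, 8, 6, 200
        H = np.zeros((r, n), dtype=int)
        for i in range(r):
            H[i, rng.choice(n, size=row_w, replace=False)] = 1
        failures, iters = 0, []
        for _ in range(trials):
            e = np.zeros(n, dtype=int)
            e[rng.choice(n, size=t, replace=False)] = 1
            e_hat, used = bit_flip_decode(H, (H @ e) % 2)
            failures += e_hat is None or not np.array_equal(e_hat, e)
            iters.append(used)
        print(f"failure rate ~ {failures / trials:.2f}, mean iterations {np.mean(iters):.1f}")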

    IPEM code of practice for high-energy photon therapy dosimetry based on the NPL absorbed dose calibration service

    The 1990 code of practice (COP), produced by the IPSM (now the Institute of Physics and Engineering in Medicine, IPEM) and the UK National Physical Laboratory (NPL), gave instructions for determining absorbed dose to water for megavoltage photon (MV) radiotherapy beams (Lillicrap et al 1990). The simplicity and clarity of the 1990 COP led to widespread uptake and high levels of consistency in external dosimetry audits. An addendum was published in 2014 to include the non-conventional conditions of Tomotherapy units. However, the 1990 COP lacked detailed recommendations for calibration conditions, and the corresponding nomenclature, to account for modern treatment units with different reference fields, including small fields as described in IAEA TRS483 (International Atomic Energy Agency (IAEA) 2017, Vienna). This updated COP recommends the irradiation geometries, the choice of ionisation chambers, appropriate correction factors and the derivation of absorbed dose to water calibration coefficients for carrying out reference dosimetry measurements on MV external beam radiotherapy machines. It also includes worked examples of application to different conditions. The strengths of the 1990 COP are retained: recommending the NPL2611 chamber type as secondary standard; the use of tissue phantom ratio (TPR) as the beam quality specifier; and NPL-provided direct calibration coefficients for the user's chamber in a range of beam qualities similar to those in clinical use. In addition, the formalism is now extended to units that cannot achieve the standard reference field size of 10 cm × 10 cm, and recommendations are given for measuring dose in non-reference conditions. This COP is designed around the service that NPL provides and thus does not require the range of different options presented in TRS483, such as generic correction factors for beam quality. This approach results in a significantly simpler, more concise and easier-to-follow protocol.
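
    At its core, an absorbed-dose-to-water COP applies a calibration coefficient to a corrected chamber reading, D_w = M * kTP * kpol * kion * N_D,w(Q). The Python sketch below shows that arithmetic with a standard temperature-pressure correction; the numerical values are purely illustrative and are not recommended values from this COP.

        def air_density_correction(temp_c, pressure_kpa, temp0_c=20.0, pressure0_kpa=101.325):
            """kTP: corrects an open-to-air ionisation chamber reading to the
            reference temperature and pressure of the calibration."""
            return ((273.15 + temp_c) / (273.15 + temp0_c)) * (pressure0_kpa / pressure_kpa)

        def dose_to_water(reading_nc, k_tp, k_pol, k_ion, n_dw_gy_per_nc):
            """Absorbed dose to water at the reference point:
            D_w = M * kTP * kpol * kion * N_D,w(Q)."""
            return reading_nc * k_tp * k_pol * k_ion * n_dw_gy_per_nc

        # Illustrative numbers only (not values recommended by the COP):
        k_tp = air_density_correction(temp_c=22.4, pressure_kpa=100.8)
        d_w = dose_to_water(reading_nc=18.52, k_tp=k_tp, k_pol=1.001, k_ion=1.003,
                            n_dw_gy_per_nc=0.0542)
        print(f"kTP = {k_tp:.4f}, D_w = {d_w:.4f} Gy")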

    Characterizing Interdisciplinarity of Researchers and Research Topics Using Web Search Engines

    Researchers' networks have been subject to active modeling and analysis. Earlier literature mostly focused on citation or co-authorship networks reconstructed from annotated scientific publication databases, which have several limitations. Recently, general-purpose web search engines have also been utilized to collect information about social networks. Here we reconstructed, using web search engines, a network representing the relatedness of researchers to their peers as well as to various research topics. Relatedness between researchers and research topics was characterized by visibility boost: the increase of a researcher's visibility obtained by focusing on a particular topic. It was observed that researchers who received high visibility boosts from the same research topic tended to be close to each other in their network. We calculated correlations between visibility boosts by research topics and researchers' interdisciplinarity at the individual level (diversity of topics related to the researcher) and at the social level (his/her centrality in the researchers' network). We found that visibility boosts by certain research topics were positively correlated with researchers' individual-level interdisciplinarity despite their negative correlations with the general popularity of researchers. It was also found that visibility boosts by network-related topics had positive correlations with researchers' social-level interdisciplinarity. Research topics' correlations with researchers' individual- and social-level interdisciplinarities were found to be nearly independent of each other. These findings suggest that the notion of "interdisciplinarity" of a researcher should be understood as a multi-dimensional concept and evaluated using multiple assessment means.
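
    One plausible way to operationalise visibility boost from search-engine hit counts is sketched below; the formula, the query format and the hit-count function are assumptions made for illustration, not the definition used in the paper.

        def visibility_boost(hits, researcher, topic, population):
            """A plausible reading of 'visibility boost' (an assumption, not the
            paper's exact formula): the researcher's share of hits among a reference
            population when the query is restricted to a topic, divided by the same
            share without the topic restriction. `hits(query)` is any function
            returning an estimated hit count, e.g. a search-engine API wrapper."""
            share_plain = hits(f'"{researcher}"') / sum(hits(f'"{r}"') for r in population)
            share_topic = (hits(f'"{researcher}" "{topic}"')
                           / sum(hits(f'"{r}" "{topic}"') for r in population))
            return share_topic / share_plain

        # Tiny stub standing in for a real hit-count API (hypothetical numbers):
        fake_counts = {'"A. Smith"': 900, '"B. Jones"': 300,
                       '"A. Smith" "complex networks"': 40,
                       '"B. Jones" "complex networks"': 60}
        hits = lambda q: fake_counts.get(q, 1)
        print(visibility_boost(hits, "B. Jones", "complex networks", ["A. Smith", "B. Jones"]))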

    Automatic Network Fingerprinting through Single-Node Motifs

    Complex networks have been characterised by their specific connectivity patterns (network motifs), but their building blocks can also be identified and described by node-motifs, a combination of local network features. One technique to identify single node-motifs was presented by Costa et al. (L. D. F. Costa, F. A. Rodrigues, C. C. Hilgetag, and M. Kaiser, Europhys. Lett., 87, 1, 2009). Here, we first suggest improvements to the method, including how its parameters can be determined automatically. Such automatic routines make high-throughput studies of many networks feasible. Second, the new routines are validated on different network series. Third, we provide an example of how the method can be used to analyse network time series. In conclusion, we provide a robust method for systematically discovering and classifying characteristic nodes of a network. In contrast to classical motif analysis, our approach can identify individual components (here: nodes) that are specific to a network. Such special nodes, like hubs before them, might be found to play critical roles in real-world networks.
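
    The node-motif idea, describing each node by a vector of local features and flagging nodes whose vectors stand out, can be sketched as follows in Python with networkx; the particular feature set and outlier rule are illustrative choices, not the parameters of the Costa et al. method or of the automatic routines proposed here.

        import networkx as nx
        import numpy as np

        def node_feature_matrix(G):
            """One feature vector per node built from local measures; the specific
            feature set here is an illustrative choice, not the published one."""
            deg = dict(G.degree())
            clust = nx.clustering(G)
            avg_nbr_deg = nx.average_neighbor_degree(G)
            betw = nx.betweenness_centrality(G)
            nodes = list(G.nodes())
            X = np.array([[deg[v], clust[v], avg_nbr_deg[v], betw[v]] for v in nodes])
            return nodes, X

        def characteristic_nodes(G, z_threshold=2.5):
            """Flag nodes whose feature vector lies far from the network's average,
            measured as a distance over standardized features."""
            nodes, X = node_feature_matrix(G)
            Z = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)
            score = np.linalg.norm(Z, axis=1)
            cutoff = score.mean() + z_threshold * score.std()
            return [v for v, s in zip(nodes, score) if s > cutoff]

        G = nx.barabasi_albert_graph(200, 2, seed=1)
        print(characteristic_nodes(G))   # typically the high-degree hubs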

    Double-blind randomized clinical trial of percutaneous endoscopic gastrostomy versus radiologically inserted gastrostomy in children

    BACKGROUND: The aim of this RCT was to determine whether radiologically inserted gastrostomy (RIG) in children is associated with more complications than percutaneous endoscopic gastrostomy (PEG). METHODS: Children at a single tertiary children's hospital requiring a primary gastrostomy were randomized to PEG or RIG. Patients were followed by assessors blinded to the insertion method. Complications were recorded, assigned a severity score, and analysed by zero-inflated Poisson regression on an intention-to-treat basis, adjusting for length of follow-up. RESULTS: Over a 3-year period, 214 children were randomized (PEG, 107; RIG, 107), of whom 100 received PEG and 96 RIG. There was no significant difference in the number of complications between the PEG and RIG groups (P = 0·875), or in the complication score: patients undergoing RIG had a 1·04 (95 per cent c.i. 0·89 to 1·21) times higher complication score than those who underwent PEG (P = 0·597). Only age had an independent significant effect on the complication score, with older patients having a 0·97 (0·95 to 1·00) times lower complication score per year of age. CONCLUSION: PEG and RIG are both safe methods of gastrostomy insertion with a low rate of major complications.
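
    The analysis described (complication counts modelled with zero-inflated Poisson regression, adjusted for length of follow-up) maps onto a standard statsmodels call; the Python sketch below uses simulated data with hypothetical variable names, not the trial dataset.

        import numpy as np
        import pandas as pd
        from statsmodels.discrete.count_model import ZeroInflatedPoisson

        rng = np.random.default_rng(42)
        n = 200
        df = pd.DataFrame({
            "rig": rng.integers(0, 2, n),                 # 1 = RIG, 0 = PEG (hypothetical)
            "age_years": rng.uniform(0.5, 16, n),
            "followup_days": rng.uniform(180, 1095, n),
        })
        # Simulated complication counts, for illustration only
        lam = np.exp(-1.0 + 0.05 * df["rig"] - 0.03 * df["age_years"]) * df["followup_days"] / 365
        df["complications"] = np.where(rng.random(n) < 0.3, 0, rng.poisson(lam))

        exog = pd.DataFrame({"const": 1.0, "rig": df["rig"], "age_years": df["age_years"]})
        model = ZeroInflatedPoisson(
            df["complications"], exog,
            exog_infl=np.ones((n, 1)),                    # constant-only inflation part
            exposure=df["followup_days"],                 # adjusts for length of follow-up
        )
        result = model.fit(disp=False)
        print(result.summary())                           # exponentiated count coefficients give rate ratios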

    Disulfide-activated protein kinase G Iα regulates cardiac diastolic relaxation and fine-tunes the Frank-Starling response.

    The Frank-Starling mechanism allows the amount of blood entering the heart from the veins to be precisely matched with the amount pumped out to the arterial circulation. As the heart fills with blood during diastole, the myocardium is stretched and oxidants are produced. Here we show that protein kinase G Iα (PKGIα) is oxidant-activated during stretch and that this form of the kinase selectively phosphorylates cardiac phospholamban at Ser16, a site important for diastolic relaxation. We find that hearts of Cys42Ser PKGIα knock-in (KI) mice, which are resistant to PKGIα oxidation, have diastolic dysfunction and a diminished ability to couple ventricular filling with cardiac output on a beat-to-beat basis. Intracellular calcium dynamics of ventricular myocytes isolated from KI hearts are altered in a manner consistent with impaired relaxation and contractile function. We conclude that oxidation of PKGIα during myocardial stretch is crucial for diastolic relaxation and fine-tunes the Frank-Starling response.

    The impact of attrition on the representativeness of cohort studies of older people

    Background: There are well-established risk factors, such as lower education, for attrition of study participants. Consequently, the representativeness of the cohort in a longitudinal study may deteriorate over time. Death is a common form of attrition in cohort studies of older people. The aim of this paper is to examine the effects of death and other forms of attrition on risk factor prevalence in the study cohort and the target population over time.